METHOD, SYSTEM AND DEVICE FOR MOBILE AUTOMATION DEVICE LOCALIZATION
Patent abstract:
A method of mobile automation device localization in a navigation controller, the method comprising: controlling a depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigation structure; selecting a primary subset of the depth measurements; selecting, from the primary subset, a corner candidate subset of the depth measurements; generating, from the corner candidate subset, a corner edge corresponding to the navigation structure; selecting, from the primary subset, an aisle subset of the depth measurements corresponding to the corner edge; selecting, from the aisle subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generating a shelf plane from the local minimum depth measurements; and updating a localization of the mobile automation device based on the corner edge and the shelf plane.
Publication number: BE1026161B1
Application number: E20195216
Filing date: 2019-04-03
Publication date: 2020-07-10
Inventors: Feng Cao; Harsoveet Singh; Richard Jeffrey Rzeszutek; Jingxing Qian; Jonathan Kelly
Applicant: Symbol Technologies Llc
IPC main class:
Patent description:
Background

Environments in which objects need to be managed, such as retail facilities, can be complex and dynamic. For example, a retail facility may include objects such as products for sale; a distribution environment may include objects such as packages or pallets; a manufacturing environment may include objects such as components or assemblies; and a healthcare environment may include objects such as medications or medical devices. A mobile device can be used to perform tasks within the environment, such as capturing data for use in identifying products that are out of stock, misplaced, and the like. To travel within the environment, a path is generated extending from a starting location to a destination location, and the device travels along the path to the destination. To travel accurately along the generated path, the device typically tracks its location within the environment. Such location tracking (also referred to as localization), however, is subject to various sources of noise and error, which can accumulate to a degree sufficient to affect navigational accuracy and impede the performance of tasks by the device, such as data capture tasks.
Summary

According to an aspect of the invention, there is provided a method of mobile automation device localization in a navigation controller, the method comprising: controlling a depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigation structure; selecting a primary subset of the depth measurements; selecting, from the primary subset, a corner candidate subset of the depth measurements; generating, from the corner candidate subset, a corner edge corresponding to the navigation structure; selecting, from the primary subset, an aisle subset of the depth measurements corresponding to the corner edge; selecting, from the aisle subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generating a shelf plane from the local minimum depth measurements; and updating a localization of the mobile automation device based on the corner edge and the shelf plane.

Furthermore, prior to capturing the depth measurements, the method may include: receiving an instruction to traverse an aisle associated with the navigation structure; obtaining a location of the navigation structure in a global frame of reference; and controlling a moving mechanism of the mobile automation device to travel to the location.

In one aspect, selecting the primary subset may include generating a primary selection region centered on the depth sensor, and selecting the depth measurements within the primary selection region. The primary selection region is, for example, a cylinder. Selecting the aisle subset may include dividing the primary subset into two sections at the corner edge and selecting one of the sections. Alternatively and/or additionally, updating the localization may include initializing a local frame of reference having an origin based on the corner edge and the shelf plane. The updated localization can be provided to a Kalman filter.
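The division of the primary subset into two sections at the corner edge can be sketched as a plan-view side-of-line test. This is an illustrative sketch only, not the patent's implementation: edge_point is the corner edge's plan-view position, and edge_dir is an assumed aisle direction; both names are hypothetical.

```python
def split_at_corner_edge(points, edge_point, edge_dir):
    """Divide a primary subset of depth measurements (plan-view XY points)
    into the two sections lying on either side of the corner edge.

    points: iterable of (x, y) tuples
    edge_point: (x, y) plan-view position of the corner edge
    edge_dir: (dx, dy) assumed aisle direction in the XY plane
    Returns (left, right) lists, split by the sign of the 2D cross product.
    """
    left, right = [], []
    ex, ey = edge_point
    dx, dy = edge_dir
    for (px, py) in points:
        # 2D cross product of the edge direction with edge_point -> point
        cross = dx * (py - ey) - dy * (px - ex)
        (left if cross >= 0.0 else right).append((px, py))
    return left, right
```

The aisle subset would then be whichever of the two sections lies on the aisle side of the corner edge.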
The method may further include: capturing image data with the depth measurements; detecting a shelf edge in the image data; and validating the shelf plane according to the shelf edge.

The method may further include: initiating a traversal of the aisle; controlling the depth sensor to capture a further plurality of depth measurements; selecting a further primary subset from the further plurality of depth measurements; selecting a further aisle subset of the depth measurements from the further primary subset; generating a further shelf plane based on the further aisle subset; and further updating the localization based on the further shelf plane. Additionally, the method may further include: determining an angle of the further shelf plane relative to an attitude plane of the mobile automation device; and rejecting the further shelf plane if the angle exceeds a threshold.

According to a further aspect of the invention, a computing device for mobile automation device localization is provided, the computing device comprising: a depth sensor; and a navigation controller configured to: control the depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigation structure; select a primary subset of the depth measurements; select, from the primary subset, a corner candidate subset of the depth measurements; generate, from the corner candidate subset, a corner edge corresponding to the navigation structure; select, from the primary subset, an aisle subset of the depth measurements corresponding to the corner edge; select, from the aisle subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generate a shelf plane from the local minimum depth measurements; and update a localization of the mobile automation device based on the corner edge and the shelf plane.
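The angle test described above can be sketched by comparing plane normals. This is an illustrative sketch: the threshold default is an assumed value, not taken from the patent, and the normals are hypothetical inputs.

```python
import math

def reject_shelf_plane(plane_normal, attitude_normal, max_angle_deg=5.0):
    """Reject a candidate shelf plane whose angle relative to the device's
    attitude plane exceeds a threshold. The 5-degree default is an assumed
    value for illustration only."""
    dot = sum(a * b for a, b in zip(plane_normal, attitude_normal))
    norm = math.sqrt(sum(a * a for a in plane_normal)) * \
           math.sqrt(sum(b * b for b in attitude_normal))
    # The angle between two planes equals the angle between their normals;
    # abs() keeps the result in [0, 90] degrees regardless of normal sign.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, abs(dot) / norm))))
    return angle > max_angle_deg
```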
Advantageously, prior to controlling the depth sensor to capture the depth measurements, the navigation controller is further configured to: receive an instruction to traverse an aisle associated with the navigation structure; obtain a location of the navigation structure in a global frame of reference; and control a moving mechanism of the mobile automation device to travel to the location.

Alternatively and/or additionally, the navigation controller is further configured to select the primary subset by: generating a primary selection region centered on the depth sensor; and selecting the depth measurements within the primary selection region. The primary selection region is, for example, a cylinder. The navigation controller of the computing device may further be configured to select the aisle subset by dividing the primary subset into two sections at the corner edge and selecting one of the sections. Alternatively and/or additionally, the navigation controller may be further configured to update the localization by initializing a local frame of reference having an origin based on the corner edge and the shelf plane. The navigation controller may further be configured to provide the updated localization to a Kalman filter.

The navigation controller may be further configured to: control an image sensor to capture image data with the depth measurements; detect a shelf edge in the image data; and validate the shelf plane according to the shelf edge. Advantageously, the navigation controller is further configured to: initiate a traversal of the aisle; control the depth sensor to capture a further plurality of depth measurements; select a further primary subset from the further plurality of depth measurements; select a further aisle subset of the depth measurements from the further primary subset; generate a further shelf plane based on the further aisle subset; and further update the localization based on the further shelf plane.
Furthermore, the navigation controller can be configured to: determine an angle of the further shelf plane relative to an attitude plane of the mobile automation device; and reject the further shelf plane if the angle exceeds a threshold.

Brief description of the several views of the drawings

The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and to explain various principles and advantages of those embodiments. In the figures:

FIG. 1 is a schematic of a mobile automation system;
FIG. 2A depicts a mobile automation device in the system of FIG. 1;
FIG. 2B is a block diagram of certain internal hardware components of the mobile automation device in the system of FIG. 1;
FIG. 3 is a block diagram of certain internal components of the mobile automation device of FIG. 1;
FIG. 4 is a flowchart of a localization method for the mobile automation device of FIG. 1;
FIG. 5 is a plan view of an aisle toward which the mobile automation device of FIG. 1 is to travel;
FIG. 6 is a partial plan view of the aisle of FIG. 5, illustrating the accumulated localization error when the mobile automation device of FIG. 1 has arrived at the aisle;
FIG. 7 is a perspective view of a portion of the aisle shown in FIG. 6;
FIGS. 8A and 8B depict depth and image data captured by the mobile automation device of FIG. 1 during the performance of the method of FIG. 4;
FIGS. 9A-9D depict an example performance of blocks 410, 415 and 420 of the method of FIG. 4;
FIGS. 10A-10C depict an example performance of blocks 425 and 430 of the method of FIG. 4;
FIG. 11 depicts an updated localization resulting from the performance of the method of FIG. 4;
FIG. 12 is a flowchart of another localization method for the mobile automation device of FIG. 1;
FIG. 13 depicts an example performance of the method of FIG. 12; and
FIG. 14 depicts an updated localization resulting from the performance of the method of FIG. 12.

Elements in the figures are illustrated for simplicity and clarity, and are not necessarily drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of embodiments of the present invention. The device and method components have been represented, where appropriate, by conventional symbols in the figures, showing only those specific details relevant to understanding the embodiments of the present invention, so as not to obscure the description with details that will be readily apparent to those of ordinary skill in the art.

Detailed description

Examples disclosed herein are directed to a method of mobile automation device localization in a navigation controller, the method comprising: controlling a depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigation structure; selecting a primary subset of the depth measurements; selecting, from the primary subset, a corner candidate subset of the depth measurements; generating, from the corner candidate subset, a corner edge corresponding to the navigation structure; selecting, from the primary subset, an aisle subset of the depth measurements corresponding to the corner edge; selecting, from the aisle subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generating a shelf plane from the local minimum depth measurements; and updating a localization of the mobile automation device based on the corner edge and the shelf plane.
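The generation of a shelf plane from the local minimum depth measurements can be sketched, under the assumption that the shelf plane is vertical, as a total-least-squares line fit to the measurements' plan-view XY projections; the patent does not specify a fitting method, so this is an illustrative sketch only.

```python
import math

def fit_shelf_plane(points_xy):
    """Total-least-squares fit of a plan-view line nx*x + ny*y = d; the
    shelf plane is assumed to be the vertical plane through that line.
    Returns (nx, ny, d) with (nx, ny) the unit normal and d the offset."""
    n = len(points_xy)
    mx = sum(p[0] for p in points_xy) / n
    my = sum(p[1] for p in points_xy) / n
    # 2x2 covariance terms of the centered points
    sxx = sum((p[0] - mx) ** 2 for p in points_xy)
    syy = sum((p[1] - my) ** 2 for p in points_xy)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points_xy)
    # Principal-axis angle: the line direction maximizing variance
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    # Normal is perpendicular to the line direction
    nx, ny = -math.sin(theta), math.cos(theta)
    d = nx * mx + ny * my
    return nx, ny, d
```

Minimizing perpendicular (rather than vertical) residuals is the natural choice here, since the line's orientation relative to the sensor axes is arbitrary.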
Additional examples disclosed herein are directed to a computing device for mobile automation device localization, the computing device comprising: a depth sensor; and a navigation controller configured to: control the depth sensor to capture a plurality of depth measurements corresponding to an area containing a navigation structure; select a primary subset of the depth measurements; select, from the primary subset, a corner candidate subset of the depth measurements; generate, from the corner candidate subset, a corner edge corresponding to the navigation structure; select, from the primary subset, an aisle subset of the depth measurements corresponding to the corner edge; select, from the aisle subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generate a shelf plane from the local minimum depth measurements; and update a localization of the mobile automation device based on the corner edge and the shelf plane.

FIG. 1 depicts a mobile automation system 100 in accordance with the teachings of this disclosure. The system 100 includes a server 101 in communication with at least one mobile automation device 103 (also referred to herein simply as the device 103) and with at least one client computing device 105 via communication links 107, illustrated in the present example as including wireless links. In the present example, the links 107 are provided by a wireless local area network (WLAN) deployed within the retail environment by one or more access points (not shown). In other examples, the server 101, the client device 105, or both, are located outside the retail environment, and the links 107 therefore include wide-area networks such as the Internet, mobile networks, and the like. The system 100 also includes, in the present example, a dock 108 for the device 103. The dock 108 communicates with the server 101 via a link 109, which in the present example is a wired link.
However, in other examples the link 109 is a wireless link. The client computing device 105 is illustrated in FIG. 1 as a mobile computing device, such as a tablet, smartphone or the like. In other examples, the client device 105 is implemented as another type of computing device, such as a desktop computer, a laptop computer, another server, a kiosk, a monitor, and the like. The system 100 may include a plurality of client devices 105 in communication with the server 101 via the links 107.

The system 100 is deployed, in the illustrated example, in a retail environment including a plurality of shelf modules 110-1, 110-2, 110-3, and so on (collectively referred to as shelves 110, and generically as a shelf 110 - this nomenclature is also employed for other elements discussed herein). Each shelf module 110 supports a plurality of products 112. Each shelf module 110 includes a shelf back 116-1, 116-2, 116-3 and a support surface (e.g., support surface 117-3 as illustrated in FIG. 1) extending from the shelf back 116 to a shelf edge 118-1, 118-2, 118-3. The shelf modules 110 are typically arranged in a plurality of aisles, each of which includes a plurality of modules 110 aligned end-to-end. In such arrangements, the shelf edges 118 face into the aisles, through which customers in the retail environment, as well as the device 103, may travel. At each end of an aisle, one of the modules 110 forms an aisle endcap, with certain ones of the shelf edges 118 of that module 110 facing not into the aisle, but outwards from the end of the aisle. In some examples (not shown), endcap structures are placed at the ends of the aisles. The endcap structures may be additional shelf modules 110, for example having reduced lengths relative to the modules 110 within the aisles, and arranged perpendicularly to the modules 110 within the aisles.

As will be apparent from FIG. 1, the term "shelf edge" 118 as employed herein, which may also be referred to as the edge of a support surface (e.g., the support surfaces 117), refers to a surface bounded by adjacent surfaces having different angles of inclination. In the example illustrated in FIG. 1, the shelf edge 118-3 is at an angle of about ninety degrees relative to each of the support surface 117-3 and the underside (not shown) of the support surface 117-3. In other examples, the angles between the shelf edge 118-3 and the adjacent surfaces, such as the support surface 117-3, may be more or less than ninety degrees. As the skilled person will appreciate, a support surface is not limited to a shelf support surface. In one embodiment, a support surface can be, for example, a table support surface (e.g., a table top). In such an embodiment, a "shelf edge" and a "shelf plane" correspond, respectively, to an edge of a support surface, such as a table support surface, and a plane containing the edge of the table support surface.

The device 103 is deployed within the retail environment, and communicates with the server 101 (e.g., via the link 107) to navigate, autonomously or partially autonomously, along a length 119 of at least a portion of the shelves 110. The device 103 is configured to navigate among the shelves 110 according to a frame of reference 102 established within the retail environment. The frame of reference 102 can also be referred to as a global frame of reference. The device 103 is configured, during such navigation, to track the location of the device 103 relative to the frame of reference 102. In other words, the device 103 is configured to perform localization. As will be described in greater detail below, the device 103 is also configured to update the above-mentioned localization by detecting certain structural features within the retail environment.
The device 103 includes a plurality of navigation and data capture sensors 104, such as image sensors (e.g., one or more digital cameras) and depth sensors (e.g., one or more Light Detection and Ranging (LIDAR) sensors, one or more depth cameras employing structured light patterns such as infrared light, or the like). The device 103 may be configured to employ the sensors 104 both to navigate among the shelves 110 and to capture shelf data during such navigation.

The server 101 includes a special-purpose controller, such as a processor 120, specifically designed to control and/or assist the mobile automation device 103 to navigate the environment and to capture data. To that end, the server 101 is configured to maintain, in a memory 122 connected with the processor 120, a memory module 132 containing data for use in navigation by the device 103.

The processor 120 can be further configured to obtain the captured data via a communications interface 124 for subsequent processing (e.g., to detect objects such as shelved products in the captured data, and to detect status information corresponding to the objects). The server 101 may also be configured to transmit status notifications (e.g., notifications indicating that products are out of stock, low in stock, or misplaced) to the client device 105 in response to the determination of product status data. The client device 105 includes one or more controllers (e.g., central processing units (CPUs) and/or field-programmable gate arrays (FPGAs) and the like) configured to process (e.g., to display) notifications received from the server 101.

The processor 120 is interconnected with a non-transitory computer-readable storage medium, such as the above-mentioned memory 122, having stored thereon computer-readable instructions for performing various functionality, including control of the device 103 to navigate the modules 110 and capture shelf data, as well as post-processing of the shelf data.
The memory 122 includes a combination of volatile memory (e.g., Random Access Memory or RAM) and non-volatile memory (e.g., read only memory or ROM, Electrically Erasable Programmable Read Only Memory or EEPROM, flash memory). The processor 120 and the memory 122 each comprise one or more integrated circuits. In some embodiments, the processor 120 is implemented as one or more central processing units (CPUs) and/or graphics processing units (GPUs).

The server 101 also includes the above-mentioned communications interface 124 interconnected with the processor 120. The communications interface 124 includes suitable hardware (e.g., transmitters, receivers, network interface controllers and the like) allowing the server 101 to communicate with other computing devices - particularly the device 103, the client device 105 and the dock 108 - via the links 107 and 109. The links 107 and 109 may be direct links, or links that traverse one or more networks, including both local and wide-area networks. The specific components of the communications interface 124 are selected based on the type of network or other links over which the server 101 is required to communicate. In the present example, as noted earlier, a wireless local-area network is implemented within the retail environment via the deployment of one or more wireless access points. The links 107 therefore include either or both of wireless links between the device 103 and the mobile device 105 and the above-mentioned access points, and/or a wired link (e.g., an Ethernet-based link) between the server 101 and the access point.

The memory 122 stores a plurality of applications, each including a plurality of computer-readable instructions executable by the processor 120. The execution of the above-mentioned instructions by the processor 120 configures the server 101 to perform various actions discussed herein.
The applications stored in the memory 122 include a control application 128, which may also be implemented as a suite of logically distinct applications. In general, via the execution of the application 128 or subcomponents thereof, and in conjunction with the other components of the server 101, the processor 120 is configured to implement various functionality. The processor 120, as configured via the execution of the control application 128, is also referred to herein as the controller 120. As will now be apparent, some or all of the functionality implemented by the controller 120 described below may also be performed by preconfigured hardware elements (e.g., one or more FPGAs and/or application-specific integrated circuits (ASICs)) rather than by execution of the control application 128 by the processor 120.

Turning now to FIGS. 2A and 2B, the mobile automation device 103 is shown in greater detail. The device 103 includes a chassis 201 containing a moving mechanism 203 (e.g., one or more electrical motors driving wheels, tracks or the like). The device 103 further includes a sensor mast 205 supported on the chassis 201 and, in the present example, extending upwards (e.g., substantially vertically) from the chassis 201. The mast 205 supports the sensors 104 mentioned earlier. In particular, the sensors 104 include at least one imaging sensor 207, such as a digital camera, as well as at least one depth sensor 209, such as a 3D digital camera capable of capturing both depth data and image data. The device 103 also includes additional depth sensors, such as LIDAR sensors 211. In other examples, the device 103 includes additional sensors, such as one or more RFID readers, temperature sensors, and the like. In the present example, the mast 205 supports seven digital cameras 207-1 through 207-7, and two LIDAR sensors 211-1 and 211-2.
The mast 205 also supports a plurality of illumination assemblies 213, configured to illuminate the fields of view of the respective cameras 207. That is, the illumination assembly 213-1 illuminates the field of view of the camera 207-1, and so on. The sensors 207 and 211 are oriented on the mast 205 such that the fields of view of each sensor face a shelf 110 along the length 119 of which the device 103 is traveling. The device 103 is configured to track a location of the device 103 (e.g., a location of the center of the chassis 201) in a common frame of reference previously established in the retail facility, permitting data captured by the mobile automation device to be registered to the common frame of reference.

The mobile automation device 103 includes a special-purpose controller, such as a processor 220, as shown in FIG. 2B, interconnected with a non-transitory computer-readable storage medium, such as a memory 222. The memory 222 includes a combination of volatile memory (e.g., Random Access Memory or RAM) and non-volatile memory (e.g., read only memory or ROM, Electrically Erasable Programmable Read Only Memory or EEPROM, flash memory). The processor 220 and the memory 222 each comprise one or more integrated circuits. The memory 222 stores computer-readable instructions for execution by the processor 220. In particular, the memory 222 stores a control application 228 which, when executed by the processor 220, configures the processor 220 to perform various functions discussed below in greater detail and related to the navigation of the device 103 (e.g., by controlling the moving mechanism 203). The application 228 may also be implemented as a suite of distinct applications in other examples.

The processor 220, when so configured by the execution of the application 228, may also be referred to as a controller 220.
As those of skill in the art will realize, the functionality implemented by the processor 220 via the execution of the application 228 may also be implemented by one or more specially designed hardware and firmware components, such as FPGAs, ASICs and the like in other embodiments. The memory 222 may also store a memory module 232 containing, for example, a map of the environment in which the device 103 operates, for use during the execution of the application 228. The device 103 may communicate with the server 101, for example to receive instructions to navigate to specified locations (e.g., to the end of a given aisle containing a set of the modules 110) and initiate data capture operations (e.g., to traverse the above-mentioned aisle while capturing image and/or depth data), via a communications interface 224 over the link 107 shown in FIG. 1. The communications interface 224 also enables the device 103 to communicate with the server 101 via the dock 108 and the link 109.

In the present example, as discussed below, the device 103 is configured (via the execution of the application 228 by the processor 220) to maintain a localization representing a location of the device 103 within a frame of reference, such as (but not necessarily limited to) the global frame of reference 102. Maintaining an updated localization enables the device 103 to generate commands for operating the moving mechanism 203 to travel to other locations, such as an aisle specified in an instruction received from the server 101. As will be apparent to those skilled in the art, localization based on inertial measurements (e.g., via accelerometers and gyroscopes), as well as localization based on odometry (e.g., via a wheel encoder coupled to the moving mechanism 203), may accumulate errors over time. The device 103 is therefore configured, as discussed below in greater detail, to update the localization by detecting certain navigation structures within the retail environment.
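The summary notes that the updated localization can be provided to a Kalman filter. As an illustrative sketch only (the patent does not specify the filter's form, and the noise parameters below are assumed values), a one-dimensional filter fusing odometry with shelf-plane distance measurements might look like:

```python
class ScalarKalman:
    """Minimal 1D Kalman filter fusing an odometry-predicted distance to
    the shelf plane with a distance measured from a detected shelf plane.
    The q and r defaults are assumed values for illustration."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.05):
        self.x = x0   # estimated distance to the shelf plane (meters)
        self.p = p0   # estimate variance
        self.q = q    # process (odometry) noise variance per step
        self.r = r    # measurement (plane detection) noise variance

    def predict(self, delta):
        # Odometry step: move by delta relative to the shelf plane
        self.x += delta
        self.p += self.q

    def update(self, z):
        # Fuse a shelf-plane distance measurement z
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

With a large initial variance, a single shelf-plane measurement pulls the estimate most of the way toward the measured distance, which mirrors how a structural detection corrects accumulated odometry drift.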
In particular, aisle endcaps and shelf planes can be employed by the device 103 to update the localization. As will be apparent in the discussion below, in other examples, some or all of the processing performed by the server 101 may be performed by the device 103, and some or all of the processing performed by the device 103 may be performed by the server 101.

Turning now to FIG. 3, before describing the actions taken by the device 103 to update its localization, certain components of the application 228 will be described in greater detail. As will be apparent to those skilled in the art, in other examples the components of the application 228 may be separated into distinct applications, or combined into other sets of components. Some or all of the components illustrated in FIG. 3 may also be implemented as dedicated hardware components, such as one or more ASICs or FPGAs.

The application 228 includes a preprocessor 300 configured to select a primary subset of depth measurements for further processing to localize the device 103. The application 228 also includes a corner generator 304 configured to detect certain navigation structures upon which localization updates are based. In the present example, the generator 304 is referred to as a corner generator because the navigation structure detected by the corner generator 304 is a corner (e.g., a vertical edge) of a shelf module 110, which may also be referred to as an endcap corner. The application 228 further includes a shelf plane generator 308 configured to generate, based on the captured depth data or a subset thereof, a plane containing the shelf edges 118 within an aisle comprising a plurality of the modules 110. In some examples, the application 228 also includes an image processor 312 configured to detect structural features, such as the shelf edges 118, from captured image data (i.e., independently of the captured depth data).
The image-based shelf edge detections are employed by the shelf plane generator 308 to validate the generated shelf plane. In other examples, the image processor 312 is omitted. The application 228 also includes a locator 316 configured to receive either or both of the corner edge generated by the corner generator 304 and the shelf plane generated by the shelf plane generator 308, and to update a localization of the device 103 in at least one frame of reference based on the above-mentioned information. As will be seen below, the frames of reference can include the global frame of reference 102 mentioned above, as well as a local frame of reference specific to a given aisle of modules 110. The locator 316 may also include subcomponents configured to generate and execute paths along which the device 103 travels (via control of the moving mechanism 203), while updating localization information.

The functionality of the application 228 will now be described in greater detail, with reference to FIG. 4. FIG. 4 depicts a method 400 of updating mobile automation device localization, which will be described in conjunction with its performance in the system 100, and in particular by the device 103, with reference to the components illustrated in FIG. 1.

At block 405, the device 103, and in particular the preprocessor 300 of the application 228, is configured to capture a plurality of depth measurements, also referred to as depth data. The depth measurements are captured via the operation of one or more depth sensors of the device 103. In the present example, the depth measurements are captured via the operation of the above-mentioned depth sensor 209 (i.e., the 3D digital camera). The 3D camera is configured to capture both depth measurements and color data, referred to herein as image data. That is, each frame captured by the 3D camera is a point cloud containing both color and depth data for each point.
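The selection of a local minimum depth measurement for each sample plane, recited in the summary, can be sketched over such a point cloud. This sketch assumes that the sample planes are horizontal slices at regular heights, which the patent does not specify, and the slice spacing is an illustrative value.

```python
import math

def local_minima_per_sample_plane(points, plane_height_step=0.1):
    """For each assumed sample plane (a horizontal slice of the point
    cloud), keep the depth measurement with the smallest range from the
    sensor, which is assumed to sit at the origin of the sensor frame.

    points: iterable of (x, y, z) in the sensor frame
    Returns a dict mapping plane index -> nearest (x, y, z) point.
    """
    minima = {}
    for p in points:
        x, y, z = p
        idx = int(z // plane_height_step)          # which sample plane
        rng = math.sqrt(x * x + y * y + z * z)     # range from the sensor
        if idx not in minima or rng < minima[idx][0]:
            minima[idx] = (rng, p)
    return {idx: pt for idx, (rng, pt) in minima.items()}
```

Keeping only the nearest point per slice discards products set back on the shelves, leaving points close to the shelf edges from which a shelf plane can be fitted.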
The point cloud is typically defined in a frame of reference centered on the sensor 209 itself. In other examples, the image data is omitted, and the performance of block 405 involves only the capture of depth data.

The device 103 is configured to perform block 405 in response to arrival of the device 103 at a specified location within the retail environment. In the present example, prior to the performance of block 405, the device 103 is configured to receive an instruction from the server 101 to travel from a current location of the device 103 to a given aisle. For example, referring to FIG. 5, the server 101 can be configured to instruct the device 103 (e.g., via the link 107) to travel from a current location in the frame of reference 102 to an aisle 500 and, upon arrival at the aisle 500, to begin a data capture operation in which the device 103 traverses the length of a plurality of modules 510-1, 510-2 and 510-3 to capture image and/or depth data depicting the modules 510.

In response to receiving the instruction, the device 103 is configured (e.g., via the execution of the locator 316) to generate and execute a path from the current location of the device 103 to a location 504 of an endcap corner of the aisle 500. The locations of the modules 510, and therefore the location 504, are contained in the map stored in the memory module 232. The locator 316 is therefore configured to retrieve the corner location 504 from the memory module 232, in order to generate and execute a path to the location 504.

FIG. 6 illustrates the device 103 following the execution of the above-mentioned path. In particular, the current location and orientation (i.e., the current pose) of the device 103 is shown in solid lines, while a localization 600 of the device 103 (i.e., a location and orientation in the frame of reference 102 as maintained by the locator 316) is shown in dashed lines. As seen in FIG. 6, the localization maintained by the locator 316 is inaccurate.
Localization deviations can arise from a variety of sources and can accumulate over time. Deviation sources include slip of the moving mechanism 203 on the floor of the retail facility, signal noise from inertial sensors, and the like. Accumulated localization deviations can reach about 20 centimeters in some examples (as will be appreciated, both larger and smaller deviations are possible). That is, the localization 600 of the device 103 in the frame 102 may be about 20 cm away from the true position of the device 103. Certain tasks, such as the above data recording operation, may require smaller localization deviations (for example, under about 5 cm). In other words, for data recording operations to produce captured data (e.g., image data depicting the modules 510) of sufficient quality for later processing, the locator 316 may need to maintain a localization sufficiently accurate to ensure that the true position of the device 103 relative to the module 510 for which data is recorded is within about 5 cm of a target position. For example, the target position may be about 75 cm from the module 510, and thus the locator 316 may need to maintain a localization that ensures that the true distance between the module 510 and the device 103 remains between about 70 cm and about 80 cm. Therefore, before starting the data recording operation, the device 103 is arranged to update the localization stored in the locator 316 via the execution of the method 400, starting with the depth and image data recording at block 405. The execution of block 405 is initiated following the arrival of the device 103 adjacent to the location 504, as shown in FIG. 6. FIG. 7 illustrates a portion of the module 510-3 adjacent to the location 504, following the arrival of the device 103 at the position shown in the plan view of FIG. 6. The module 510-3 includes a pair of support surfaces 717-1 and 717-2 that extend from a shelf back 716 to shelf edges 718-1 and 718-2. 
The support surface 717-2 supports products 712 thereon, while the support surface 717-1 does not directly support products 712 itself. Instead, the shelf back 716 supports pegs 720 on which additional products 712 are supported. Also shown is a portion of a ground surface 724 along which the device 103 moves, corresponding to the X-Y plane of the frame of reference 102 (i.e. with a height of zero on the Z axis of the frame of reference 102 shown in FIG. 1). FIGS. 8A and 8B illustrate an example of the data recorded at block 405. In particular, FIG. 8A is a set of depth measurements corresponding to the module 510-3, in the form of a point cloud 800, while FIG. 8B shows image data 850. In the present example, the sensor 209 is arranged to record depth and image data substantially simultaneously, and the depth and image data are stored in a single file (for example, each point in the point cloud 800 also includes color data corresponding to the image data 850). The depth data 800 and the image data 850 are therefore shown separately in FIGS. 8A and 8B for illustrative purposes only. Returning to FIG. 4, at block 410 the preprocessor 300 is arranged to select a primary subset of the depth data recorded at block 405. The primary subset of depth measurements is selected to reduce the volume of depth measurements to be processed in the remainder of the method 400, while still including the structural features on which the device 103 is arranged to base localization updates. In the present example, the primary subset is selected at block 410 by selecting depth measurements within a predefined threshold distance from the sensor 209 (i.e., excluding depth measurements at a greater distance from the sensor than the threshold). 
More specifically, in the present example the preprocessor 300 is arranged to select the primary subset by selecting depth measurements from the point cloud 800 that fall within a primary selection area, such as a cylindrical area of predefined dimensions and position relative to the sensor 209. FIG. 9A illustrates an example cylindrical selection area 900 centered on the location 904 of the sensor 209, which is typically the origin of the frame of reference in which the point cloud 800 is defined. The area 900 has a predefined diameter sufficient to include the corner of the end module 510-3 despite the potentially inaccurate localization 600 of the device 103 shown in FIG. 6. The area 900 also has a base located at a predefined height relative to the sensor 209 (for example, placing the base of the area 900 about 2 cm above the ground surface 724). The area 900 further has a predefined height (i.e. the distance from the base to the top of the cylinder) selected to encompass substantially the entire height of the modules 510 (e.g. about 2 meters). In some examples, at block 410 the preprocessor is also arranged to select a ground plane subset of depth measurements, for example by applying a pass filter to select only points within a predefined distance from the X-Y plane of the frame of reference 102 (e.g., above a height of -2 cm and below a height of 2 cm). The ground plane subset can be used to generate a ground plane (e.g., via a suitable plane-fit operation) for use in validating subsequent processing steps of the method 400, as will be discussed below. Returning to FIG. 4, at block 415 the corner generator 304 is configured to select a corner candidate subset of depth measurements from the primary subset and to generate a corner edge from the corner candidate subset. The execution of block 415 serves to further constrain the collection of depth measurements within which the corner of the end module 510-3 is contained. 
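The cylindrical primary selection at block 410 can be sketched as follows. This is an illustrative Python/NumPy sketch, not the patent's implementation; the function name and the radius and height values are assumptions chosen for the example.

```python
import numpy as np

def primary_subset(points, sensor_xy=(0.0, 0.0), radius=1.5,
                   z_min=0.02, z_max=2.0):
    # Keep only the points inside a vertical cylinder centered on the
    # sensor: within `radius` horizontally, and between the cylinder's
    # base (z_min) and top (z_max). All numeric values are illustrative.
    pts = np.asarray(points, dtype=float)
    dx = pts[:, 0] - sensor_xy[0]
    dy = pts[:, 1] - sensor_xy[1]
    in_radius = np.hypot(dx, dy) <= radius
    in_height = (pts[:, 2] >= z_min) & (pts[:, 2] <= z_max)
    return pts[in_radius & in_height]
```

A ground plane subset could be selected analogously, with a band filter on the z coordinate alone.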
With reference to FIG. 9B, the corner generator 304 is arranged to select the corner candidate subset, in the present example, by identifying the depth measurement within the primary subset that is closest to the sensor location 904. In particular, FIG. 9B is a top plan view of the primary subset of depth measurements. The primary subset is shown as a wedge rather than a full cylinder because the sensor 209 has a field of view of less than 360 degrees (e.g., about 130 degrees in the example shown). As can be seen in FIG. 9B, only a subset of the depth measurements of the point cloud 800 (the primary subset referred to above) is shown. In particular, no depth measurements corresponding to the ground surface 724 are present in the primary subset. The corner generator 304 is arranged to identify the point 908 in the primary subset as the point closest to the location 904 (i.e. the location of the sensor 209). The point 908 is assumed to correspond to a portion of the corner of the end module 510-3. The corner generator is therefore arranged, in response to identifying the point 908, to select the above-mentioned corner candidate subset by generating a corner candidate selection area based on the point 908. In the present example, the corner candidate selection area is a further cylinder, with a smaller predefined diameter than the cylinder 900 mentioned earlier, and with a longitudinal axis passing through the point 908. An example corner candidate selection area 912 is shown in FIG. 9A. The area 912 can be positioned at the same height (for example, 2 cm above the ground surface 724) as the area 900 and can have the same height as the area 900. After the corner candidate selection area 912 is selected, the corner generator 304 is arranged to fit an edge (i.e., a line) to the points included in the area 912. Referring to FIG. 9C, the area 912 and the corner candidate subset of depth measurements included therein are shown in isolation. 
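The corner candidate selection just described can be sketched as below. The sketch is an illustrative simplification: it finds the nearest point, gathers the candidates inside a small cylinder around it, and reduces the vertical-line fit to averaging their horizontal coordinates; a RANSAC line fit constrained to be perpendicular to the ground plane could be substituted for robustness. Function name and radius are assumptions.

```python
import numpy as np

def corner_edge(primary, sensor_xy=(0.0, 0.0), candidate_radius=0.1):
    # 1) Nearest point to the sensor is assumed to lie on the corner.
    pts = np.asarray(primary, dtype=float)
    d = np.hypot(pts[:, 0] - sensor_xy[0], pts[:, 1] - sensor_xy[1])
    nearest = pts[np.argmin(d)]
    # 2) Corner candidate subset: points inside a narrow vertical
    #    cylinder whose axis passes through the nearest point.
    in_cand = np.hypot(pts[:, 0] - nearest[0],
                       pts[:, 1] - nearest[1]) <= candidate_radius
    cand = pts[in_cand]
    # 3) A vertical line is fully described by its (x, y) position;
    #    averaging the candidates gives that position.
    return cand[:, 0].mean(), cand[:, 1].mean()
```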
A corner edge 916, fitted to the points of the corner candidate subset, is also shown in FIG. 9C. The corner edge 916 is generated via a suitable line-fit operation, such as a random sample consensus (RANSAC) line fit. Constraints can also be applied to the line-fit operation. For example, the corner generator 304 can be arranged to fit a substantially vertical line to the points of the corner candidate subset by imposing a constraint that the resulting corner edge 916 be substantially perpendicular to the above-mentioned ground plane. Returning to FIG. 4, at block 420, in response to generating the corner edge 916, the corner generator 304 is arranged to select a path subset of depth measurements from the primary subset (shown in FIG. 9B), based on the corner edge 916. In particular, with reference to FIG. 9D, a path subset 924 is selected from the primary subset, excluding a remainder 928 of the primary subset, by selecting only the depth measurements of the primary subset that lie on a predefined side of the corner edge 916 relative to the center location 904. For example, the corner generator 304 is arranged to divide the primary subset with a plane 920 extending through the corner edge 916 and intersecting the center 904. The path subset 924 is the subset of points on the side of the plane 920 that corresponds to the interior of the path 500. In other examples, at block 420 the corner generator 304 is also arranged to select an end module subset, corresponding to the remainder 928 of the primary subset shown in FIG. 9D. As will now be understood, the end module subset is believed to include the edges 718 extending perpendicular to the path 500. At block 425, the shelf plane generator 308 is arranged to select local minima from the path subset for use in the generation of a shelf plane at block 430. More specifically, with reference to FIG. 
10A, in the present example the shelf plane generator 308 is arranged to generate a plurality of sample planes 1000-1, 1000-2, 1000-3, etc. extending from the center location 904 at predefined angles through the path subset of depth measurements. For each sample plane 1000, the depth measurements within a threshold distance from that sample plane are projected onto it. A plurality of depth measurements 1004 are shown in FIG. 10A as lying within the above-mentioned threshold distance from the planes 1000. Further, as shown in FIG. 10B, for each sample plane the one of the measurements 1004 closest to the location 904 is selected. Thus, three local minimum points 1008-1, 1008-2 and 1008-3 are selected in FIG. 10B, and the remaining points in the path subset are discarded. The shelf plane generator 308 is then arranged to generate a shelf plane for the path 500 at block 430, by performing a suitable plane-fit operation (e.g., a RANSAC operation) on the local minima selected at block 425. FIG. 10C illustrates the result of such a plane fit in the form of a shelf or path plane 1012 (the local minima 1008 mentioned above are also shown for illustrative purposes). The path plane generation at block 430 may include one or more validation operations. For example, constraints can be imposed on the plane-fit operation, such as a requirement that the resulting path plane be substantially perpendicular to the aforementioned ground plane. In some examples, constraints for use at block 430 may be generated from the image data 850 (i.e., independently of the depth measurements 800). In particular, in some examples the preprocessor 300 is arranged to execute block 435 following the data recording at block 405. At block 435, the preprocessor 300 is arranged to generate one or more shelf edges from the image data 850 according to a suitable edge detection operation. 
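The local-minimum selection at block 425 and the plane fit at block 430 can be sketched in the top-down view: with the shelf plane constrained to be perpendicular to the ground plane, the plane fit reduces to a 2-D line fit through the local minima. This Python/NumPy sketch samples directions by bearing angle and uses a least-squares fit in place of RANSAC; angles, tolerance and function name are illustrative assumptions.

```python
import numpy as np

def shelf_line(points, angles_deg=(60, 90, 120), tol_deg=5.0):
    # Work in the X-Y (top-down) view.
    pts = np.asarray(points, dtype=float)[:, :2]
    bearings = np.degrees(np.arctan2(pts[:, 1], pts[:, 0]))
    ranges = np.hypot(pts[:, 0], pts[:, 1])
    minima = []
    for a in angles_deg:
        # Points within the angular tolerance stand in for the points
        # projected onto one sample plane; keep the nearest of them.
        near = np.abs(bearings - a) <= tol_deg
        if near.any():
            idx = np.flatnonzero(near)[np.argmin(ranges[near])]
            minima.append(pts[idx])
    minima = np.array(minima)
    # Least-squares line y = m*x + c through the local minima; the
    # vertical plane through this line is the shelf (path) plane.
    m, c = np.polyfit(minima[:, 0], minima[:, 1], 1)
    return m, c
```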
An example of such an edge detection operation includes converting the image data 850 to grayscale and optionally downsampling it. The preprocessor 300 can then be arranged to apply, for example, a Sobel filter to the image data to extract gradients (e.g. vertical gradients denoting horizontal edges). The preprocessor 300 can then apply a Hough transform to the resulting gradients to generate candidate edge lines. As will be appreciated, other shelf edge detection operations can also be used at block 435, such as a Canny edge detector. After shelf edges have been generated (e.g. corresponding to the shelf edges 718-1 and 718-2 shown in FIG. 7), the preprocessor 300 can be arranged to recover the positions (in the point cloud 800) of pixels in the image data 850 lying on the shelf edges. Those positions are then used at block 430 to validate the path plane generated by the shelf plane generator 308. For example, the shelf plane generator 308 may be arranged to verify that the path plane 1012 includes the points lying on the shelf edges, or that such points lie within a threshold distance from the path plane 1012. In other examples, the preprocessor 300 is arranged to fit a validation plane to the shelf edge points, and the shelf plane generator 308 is configured to apply the validation plane as a constraint during the generation of the path plane 1012 (for example, by requiring that the angle between the path plane 1012 and the validation plane not exceed a predetermined threshold). In further examples, the preprocessor 300 may be configured to validate the path plane by determining whether the angles between the shelf edges themselves (for example the candidate edge lines mentioned above) and the path plane 1012 exceed a threshold angle. Returning to FIG. 
4, at block 440, the locator 316 is arranged to update the localization of the device 103 in accordance with the corner edge 916 and the path plane 1012. As will now be apparent, the position and orientation of the device 103 relative to the corner edge 916 and the path plane 1012 are determined from the point cloud 800 without being subject to certain deviation sources (e.g., inertial sensor drift, wheel slip, and the like) that are responsible for a portion of the deviation between the previous localization 600 and the true position of the device 103. Therefore, updating the localization at block 440 in the present example includes initializing a local frame of reference having an origin at the intersection of the corner edge 916, the path plane 1012 and the above-mentioned ground plane. FIG. 10C illustrates a local frame of reference 1016 in which the path plane 1012 is the X-Z plane and the ground plane is the X-Y plane. The locator 316 can therefore be arranged to determine a position of the device 103 in the frame of reference 1016. In further examples, the locator 316 is arranged to update the localization of the device 103 by retrieving (e.g., from the map in the memory module 232) a predefined true location of the corner of the end module 510-3 in the global frame of reference 102. The position and orientation of the device 103 in the global frame of reference 102 can then be determined from the true location of the corner of the end module 510-3 and the position and orientation of the device 103 relative to the corner edge 916 and the path plane 1012. With reference to FIG. 11, the previous localization 600 is shown next to the true position of the device 103 and an updated localization 1100 obtained through the execution of the method 400. The locator 316 can also be arranged to initialize or update a Kalman filter adapted to accept inertial sensor data, wheel odometry, lidar odometry and the like as inputs and to generate pose estimates for the device 103. Following the completion of the method 400, the device 103 is arranged to traverse the path 500 according to the aforementioned data recording instruction (received from the server 101). As will be appreciated, additional deviation may accumulate during the traverse in the localization obtained at block 440. The device 103 is therefore arranged to repeat the localization update process detailed above in relation to FIG. 4, with certain differences described below. FIG. 12 illustrates a method 1200 for updating localization during movement through a path (e.g. the path 500). The method 1200 can therefore be initiated following an execution of the method 400 at an entrance to the path 500, as discussed above. Execution of the method 1200 includes recording depth and (optionally) image data at block 1205, selecting a primary subset of the depth measurements at block 1210, and selecting local minima from the primary subset at block 1225. The execution of blocks 1205, 1210 and 1225 is as described above in relation to blocks 405, 410 and 425, respectively. As will now be apparent, the detection of a corner through the generation of a corner edge is omitted in FIG. 12. The local minima selected at block 1225 are therefore selected from the entire primary subset rather than from a portion of the primary subset as illustrated in FIG. 9D. Following the selection of local minima at block 1225, at block 1227 the device 103 (and in particular the shelf plane generator 308) is arranged to generate a pose filter plane and to select a path subset of depth measurements based on the pose filter plane. With reference to FIG. 13, an example execution of block 1227 is discussed. FIG. 13 shows the true position of the device 103 in solid lines and the current localization 1300 of the device 103. 
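Expressing the device pose in the local frame, as at block 440 of the method 400, can be sketched in the top-down view: the local origin is the corner edge's ground intersection and the local X axis runs along the shelf plane. The interface below (function name, 2-D unit-vector argument for the shelf direction) is an assumption made for illustration, not the patent's.

```python
import numpy as np

def local_pose(corner_xy, shelf_dir_xy, device_xy, device_heading):
    # Offset of the device from the local origin, in world coordinates.
    ux, uy = shelf_dir_xy  # unit vector along the fitted shelf line
    rel = np.array(device_xy, dtype=float) - np.array(corner_xy, dtype=float)
    # Rotate the world offset into the local frame:
    # X along the shelf plane, Y into the path/aisle.
    x_local = rel[0] * ux + rel[1] * uy
    y_local = -rel[0] * uy + rel[1] * ux
    theta_local = device_heading - np.arctan2(uy, ux)
    return x_local, y_local, theta_local
```

The inverse of the same rotation, applied to a mapped true corner location, would yield the updated pose in the global frame 102.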
As will be appreciated, a certain amount of deviation has accumulated in the localization 1300. FIG. 13 also shows a plurality of local minimum points 1304 obtained through the execution of block 1225. Certain local minima may represent sensor noise, or depth measurements corresponding to products 712 on the shelf support surfaces 717. Therefore, the shelf plane generator 308 is arranged to generate a pose filter plane 1308 and to select a path subset of the points 1304, namely the subset of the points 1304 located between the pose filter plane 1308 and a pose plane 1312 corresponding to the current pose of the device 103 (according to the localization 1300). The position of the pose filter plane 1308 is set at a distance 1316 from the pose plane 1312. The distance 1316 can be predefined, or can be determined as a multiple (typically greater than one) of a distance 1320 between the nearest point in the primary subset and the pose plane 1312. The multiplication factor itself can be fixed, or can be determined dynamically based on the orientation angle of the device 103 relative to the X axis of the local frame of reference 1016. For example, the factor can be set to increase as the orientation angle diverges from a zero-degree angle. After the pose filter plane 1308 has been generated and the path subset of points has been selected at block 1227, the shelf plane generator 308 is arranged to generate a shelf plane (also referred to herein as a path plane, as noted previously) at block 1230 based on the path subset of the depth measurements. Block 1230 is as described above in relation to block 430, and may include the use of the image-derived shelf edges of block 1235 (which is as described in relation to block 435). Referring again to FIG. 13, two candidate path planes 1324 and 1328 are shown. 
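The pose filter plane selection at block 1227 can be sketched as below, assuming the perpendicular distances from each local minimum to the pose plane 1312 have already been computed. The factor value and function name are illustrative; the text also allows a predefined distance, or a factor tied to the device's orientation angle.

```python
import numpy as np

def filter_by_pose_plane(minima_dists, factor=1.5):
    # Keep the local minima lying between the pose plane (distance 0)
    # and a pose filter plane placed at `factor` (> 1) times the
    # distance of the nearest minimum.
    d = np.asarray(minima_dists, dtype=float)
    cutoff = factor * d.min()
    return d[d <= cutoff]
```

Minima beyond the cutoff, e.g. returns from products set back on the shelves, are discarded before the plane fit.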
At block 1232, the shelf plane generator is arranged to select one of the planes 1324 and 1328 and to determine whether the angle of the selected plane relative to the pose filter plane 1308 (or the pose plane 1312, if the planes 1308 and 1312 are parallel to each other) exceeds a predetermined threshold. The determination at block 1232 reflects an assumption that, although the localization 1300 may include some degree of deviation, that deviation is not unbounded, and that certain plane angles are therefore unlikely to correspond to true shelf planes. More specifically, the device 103 is adapted to traverse the path 500 while remaining substantially parallel to the shelf edges 718 of the modules 510. A plane generated at block 1230 whose angle indicates that the device 103 has deviated from the above-mentioned parallel orientation by more than a threshold is therefore unlikely to be a correctly fitted shelf plane. For example, the angle threshold can be about ten degrees. In the present example, the determination at block 1232 is therefore affirmative for the plane 1324, and the execution of the method 1200 proceeds to block 1233 to determine whether any planes remain to be evaluated. If that determination is negative, the execution of the method 1200 begins again at block 1205. When additional planes remain to be evaluated, the execution of block 1232 is repeated for the next plane (in the present example, the plane 1328). As is evident from FIG. 13, the plane 1328 is substantially parallel to the pose plane 1312, and the determination at block 1232 is therefore negative. The plane 1328 is accordingly selected as the path plane, and the locator 316 is arranged to update the localization of the device 103 based on the path plane 1328. As will now be appreciated, the path plane 1328 represents the detected location of the X-Z plane of the frame of reference 1016. 
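The angle check at block 1232 can be sketched as a comparison of plane normals in the top-down view. This is an illustrative Python/NumPy sketch under the assumption that both planes are vertical, so each is characterized by a 2-D normal; the ten-degree default follows the example threshold above.

```python
import numpy as np

def accept_plane(plane_normal_xy, pose_normal_xy, max_angle_deg=10.0):
    # Angle between the candidate path plane and the pose (filter)
    # plane, computed from their 2-D normals; reject if it exceeds
    # the threshold.
    a = np.asarray(plane_normal_xy, dtype=float)
    b = np.asarray(pose_normal_xy, dtype=float)
    cosang = abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return bool(angle <= max_angle_deg)
```

A rejected candidate (such as the plane 1324) would be skipped in favor of the next one, or the capture repeated if none remain.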
Therefore, at block 1240, the locator 316 can be arranged to update the perceived orientation of the device 103 relative to the X-Z plane based on the orientation of the path plane 1328 in the point cloud recorded at block 1205. FIG. 14 illustrates an updated localization 1400 generated at block 1240, with the orientation corrected relative to the localization 1300. As noted above in relation to block 440, the locator 316 can also be arranged to update the Kalman filter with the updated localization 1400. Returning to FIG. 12, at block 1245, the device 103 is arranged to determine, based on the updated localization, whether the path 500 has been completely traversed. The determination at block 1245 can be based on the local frame 1016 or the global frame 102, since the length of the path 500 is known from the map. When the determination at block 1245 is negative, the execution of the method 1200 is repeated as the device 103 continues to traverse the path 500. When the determination at block 1245 is affirmative, the execution of the method 1200 ends. Specific embodiments have been described in the foregoing description. It is noted, however, that various modifications and changes can be made without departing from the scope of the invention as set out in the claims below. Accordingly, the description and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present teachings. The benefits, advantages, solutions to problems, and any element that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all of the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued. 
For the purpose of clarity and brevity of description, features herein are described as part of the same or separate embodiments. It should be noted, however, that the scope of the invention may include embodiments having combinations of all or some of the features described herein. The embodiments shown may be assumed to include similar or equivalent components, except where described otherwise. In addition, in this document, relative terms such as first and second, top and bottom, and the like may be used only to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "includes", "including", "has", "having", "contains", "containing", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or device that includes a list of elements includes not only those elements, but may also include other elements not expressly listed or inherent in such a process, method, article, or device. An element preceded by "includes ... a", "has ... a", or "contains ... a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or device that includes the element. The term "a" is defined as one or more, unless explicitly stated otherwise herein. The terms "substantially", "essentially", "approximately", or any other version thereof, are defined as being close to, as understood by those skilled in the art, and in one non-limiting embodiment the term is defined as being within 10%, in another embodiment as being within 5%, in another embodiment as being within 1%, and in another embodiment as being within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. 
A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not described. It is noted that some embodiments may comprise one or more generic or specialized processors (or "processing devices"), such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs), and uniquely stored program instructions (including both software and firmware) that control the one or more processors, in combination with certain non-processor circuits, to implement some, most, or all of the functions of the method and/or device described herein. Alternatively, some or all of the functions could be implemented by a state machine that has no stored program instructions, or in one or more application-specific integrated circuits (ASICs), in which each function, or some combinations of certain functions, are implemented as custom logic. Of course, a combination of the two approaches could be used. In addition, an embodiment can be implemented as a computer-readable storage medium having stored thereon computer-readable code for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage media include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (read-only memory), a PROM (programmable read-only memory), an EPROM (erasable programmable read-only memory), an EEPROM (electrically erasable programmable read-only memory) and a flash memory. It is further expected that, despite potentially significant effort and many design choices motivated by, for example, available time, current technology and economic considerations, one of ordinary skill, when guided by the concepts and principles described herein, will readily be able to generate such software instructions and programs and ICs with minimal experimentation. 
The abstract of the description is provided to enable the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope of the claims. In addition, it can be seen in the foregoing detailed description that various features are grouped together in various embodiments for the purpose of streamlining the description. This manner of description is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the inventive subject matter lies in less than all features of a single described embodiment. Thus the following claims are hereby incorporated into the detailed description, with each claim standing on its own as separately claimed subject matter. The mere fact that certain measures are recited in mutually different claims does not indicate that a combination of these measures cannot be used to advantage. A large number of variants will be apparent to the skilled person. All variants are deemed to be encompassed within the scope of the invention as defined in the following claims.
Claims:
Claims (20) [1] A method for mobile automation device localization in a navigation controller, the method comprising: controlling a depth sensor to record a plurality of depth measurements corresponding to an area comprising a navigation structure; selecting a primary subset of the depth measurements; selecting, from the primary subset, a corner candidate subset of the depth measurements; generating, from the corner candidate subset, a corner edge corresponding to the navigation structure; selecting, from the primary subset, a path subset of the depth measurements corresponding to the corner edge; selecting, from the path subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generating a shelf plane from the local minimum depth measurements; and updating a localization of the mobile automation device based on the corner edge and the shelf plane. [2] The method of claim 1, further comprising, prior to recording the depth measurements: receiving an instruction to traverse a path associated with the navigation structure; obtaining a location of the navigation structure in a global frame of reference; and controlling a moving mechanism of the mobile automation device to move to the location. [3] The method of claim 1 or 2, wherein selecting the primary subset comprises generating a primary selection area centered on the depth sensor and selecting the depth measurements within the primary selection area. [4] The method of claim 3, wherein the primary selection area is a cylinder. [5] The method of any preceding claim, wherein selecting the path subset comprises dividing the primary subset into two sections according to the corner edge, and selecting one of the sections. [6] The method of any preceding claim, wherein updating the localization comprises initializing a local frame of reference that has an origin based on the corner edge and the shelf plane. 
[7] The method of any preceding claim, further comprising providing the updated localization to a Kalman filter. [8] The method of any preceding claim, further comprising: recording image data with the depth measurements; detecting a shelf edge in the image data; and validating the shelf plane according to the shelf edge. [9] The method of claim 2, further comprising: initiating a traverse of the path; controlling the depth sensor to record a further plurality of depth measurements; selecting a further primary subset of depth measurements from the further plurality of depth measurements; selecting a further path subset of the depth measurements from the further primary subset; generating a further shelf plane based on the further path subset; and further updating the localization based on the further shelf plane. [10] The method of claim 9, further comprising: determining an angle of the further shelf plane with respect to a pose plane of the mobile automation device; and rejecting the further shelf plane if the angle exceeds a threshold. 
[11] A computing device for mobile automation device localization, comprising: a depth sensor; and a navigation controller arranged to: control the depth sensor to record a plurality of depth measurements corresponding to an area comprising a navigation structure; select a primary subset of the depth measurements; select, from the primary subset, a corner candidate subset of the depth measurements; generate, from the corner candidate subset, a corner edge corresponding to the navigation structure; select, from the primary subset, a path subset of the depth measurements corresponding to the corner edge; select, from the path subset, a local minimum depth measurement for each of a plurality of sample planes extending from the depth sensor; generate a shelf plane from the local minimum depth measurements; and update a localization of the mobile automation device based on the corner edge and the shelf plane. [12] The computing device of claim 11, wherein the navigation controller is further arranged to, prior to controlling the depth sensor to record the depth measurements: receive an instruction to traverse a path associated with the navigation structure; obtain a location of the navigation structure in a global frame of reference; and control a moving mechanism of the mobile automation device to move to the location. [13] The computing device of claim 11 or 12, wherein the navigation controller is further arranged to select the primary subset by: generating a primary selection area centered on the depth sensor; and selecting the depth measurements within the primary selection area. [14] The computing device of claim 13, wherein the primary selection area is a cylinder. [15] The computing device of any of claims 11-14, wherein the navigation controller is further arranged to select the path subset by dividing the primary subset into two sections according to the corner edge, and selecting one of the sections. 
[16] The computing device of any of claims 11-15, wherein the navigation controller is further configured to update the localization by initializing a local frame of reference having an origin based on the corner edge and the shelf plane.

[17] The computing device of any of claims 11-16, wherein the navigation controller is further configured to provide the updated localization to a Kalman filter.

[18] The computing device of any of claims 11-17, wherein the navigation controller is further configured to: control an image sensor to capture image data along with the depth measurements; detect a shelf edge in the image data; and validate the shelf plane against the shelf edge.

[19] The computing device of claim 12, wherein the navigation controller is further configured to: initiate a traversal of the path; control the depth sensor to capture a further plurality of depth measurements; select a further primary subset of the depth measurements from the further plurality of depth measurements; select a further path subset of the depth measurements from the further primary subset; generate a further shelf plane based on the further path subset; and further update the localization based on the further shelf plane.

[20] The computing device of claim 19, wherein the navigation controller is further configured to: determine an angle of the further shelf plane relative to an attitude plane of the mobile automation device; and reject the further shelf plane if the angle exceeds a threshold.
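Claim 11 recites selecting, per sample plane extending from the depth sensor, a local minimum depth measurement, and generating a shelf plane from those minima. The sketch below illustrates the idea in a top-down 2-D view, assuming the sensor at the origin: points are binned by azimuth into sample planes, the closest return per plane is kept, and a least-squares line fit stands in for the patent's shelf-plane generation step (the fit method and bin count are assumptions):

```python
import math

def shelf_plane_from_minima(points, n_sample_planes=16):
    """Bin 2-D depth returns (x, y) by azimuth into sample planes radiating
    from the sensor at the origin, keep the minimum-range point per plane,
    then fit a line y = a*x + b to those minima as a top-down shelf plane
    estimate. Returns (a, b)."""
    bins = {}
    for x, y in points:
        # index of the sample plane containing this return
        k = int((math.atan2(y, x) + math.pi) / (2 * math.pi) * n_sample_planes)
        r = math.hypot(x, y)
        if k not in bins or r < bins[k][0]:
            bins[k] = (r, x, y)  # keep the local minimum per sample plane
    minima = [(x, y) for _, x, y in bins.values()]
    # ordinary least-squares line fit through the per-plane minima
    n = len(minima)
    sx = sum(x for x, _ in minima)
    sy = sum(y for _, y in minima)
    sxx = sum(x * x for x, _ in minima)
    sxy = sum(x * y for x, y in minima)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

Returns lying behind the shelf face along the same sample plane are suppressed by the per-plane minimum, so only the nearest surface contributes to the fitted plane.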
Similar technologies:

Publication number | Publication date | Patent title
AU2018220103B2 | 2020-03-05 | Method and apparatus for support surface edge detection
BE1026161B1 | 2020-07-10 | Method, system and device for mobile automation device localization
BE1025892A9 | 2019-08-07 | Method for screen edge detection
US20180313956A1 | 2018-11-01 | Device and method for merging lidar data
US10521914B2 | 2019-12-31 | Multi-sensor object recognition system and method
US10832436B2 | 2020-11-10 | Method, system and apparatus for recovering label positions
US20210272316A1 | 2021-09-02 | Method, System and Apparatus for Object Detection in Point Clouds
US10731970B2 | 2020-08-04 | Method, system and apparatus for support structure detection
US11151743B2 | 2021-10-19 | Method, system and apparatus for end of aisle detection
US11079240B2 | 2021-08-03 | Method, system and apparatus for adaptive particle filter localization
US11042161B2 | 2021-06-22 | Navigation control method and apparatus in a mobile automation system
AU2019396253A1 | 2021-04-29 | Method, system and apparatus for auxiliary label detection and association
BE1027283B1 | 2021-02-25 | Method, system and device for detecting product views
US20200182623A1 | 2020-06-11 | Method, system and apparatus for dynamic target feature mapping
US20200183408A1 | 2020-06-11 | Method and apparatus for navigational ray tracing
US20210173405A1 | 2021-06-10 | Method, System and Apparatus for Localization-Based Historical Obstacle Handling
US11158075B2 | 2021-10-26 | Method, system and apparatus for depth sensor artifact removal
US20200379477A1 | 2020-12-03 | Method, System and Apparatus for Mitigating Data Capture Light Leakage
US11015938B2 | 2021-05-25 | Method, system and apparatus for navigational assistance
US20200380317A1 | 2020-12-03 | Method, System and Apparatus for Gap Detection in Support Structures with Peg Regions
CA3113791A1 | 2020-06-18 | Method and apparatus for control of mobile automation apparatus light emitters
WO2020247067A1 | 2020-12-10 | Method, system and apparatus for shelf edge detection
Family patents:

Publication number | Publication date
GB2586405A | 2021-02-17
CA3095925A1 | 2019-10-10
GB202015594D0 | 2020-11-18
BE1026161A9 | 2019-10-29
WO2019195595A1 | 2019-10-10
BE1026161A1 | 2019-10-22
DE112019001796T5 | 2021-02-11
US20190310652A1 | 2019-10-10
Cited references:

Publication number | Filing date | Publication date | Applicant | Patent title
US20150363625A1 | 2014-06-13 | 2015-12-17 | Xerox Corporation | Image processing methods and systems for barcode and/or product label recognition
US20170286901A1 | 2016-03-29 | 2017-10-05 | Bossa Nova Robotics Ip, Inc. | System and Method for Locating, Identifying and Counting Items
US20150170256A1 | 2008-06-05 | 2015-06-18 | Aisle411, Inc. | Systems and Methods for Presenting Information Associated With a Three-Dimensional Location on a Two-Dimensional Display
JP5905031B2 | 2011-01-28 | 2016-04-20 | InTouch Technologies, Inc. | Interfacing with mobile telepresence robot
TWI594933B | 2013-03-15 | 2017-08-11 | Symbotic LLC | Automated storage and retrieval system
US9565400B1 | 2013-12-20 | 2017-02-07 | Amazon Technologies, Inc. | Automatic imaging device selection for video analytics
US10731970B2 | 2018-12-13 | 2020-08-04 | Zebra Technologies Corporation | Method, system and apparatus for support structure detection
US11151743B2 | 2019-06-03 | 2021-10-19 | Zebra Technologies Corporation | Method, system and apparatus for end of aisle detection
Legal status:

2020-08-26 | FG | Patent granted | Effective date: 2020-07-10
Priority:
Application number | Filing date | Patent title
US 15/946,412 | 2018-04-05 | US20190310652A1 | Method, system and apparatus for mobile automation apparatus localization